Graph Attention Networks with Positional Embeddings

Authors

Abstract

Graph Neural Networks (GNNs) are deep learning methods that provide the current state-of-the-art performance in node classification tasks. GNNs often assume homophily – neighboring nodes having similar features and labels – and therefore may not reach their full potential when dealing with non-homophilic graphs. In this work, we focus on addressing this limitation by enabling Graph Attention Networks (GAT), a commonly used variant of GNNs, to explore the structural information within each graph locality. Inspired by the positional encoding in Transformers, we propose a framework, termed Graph Attentional Networks with Positional Embeddings (GAT-POS), which enhances GATs with positional embeddings that capture structural and positional information of the nodes in the graph. The positional embeddings are learned by a model predictive of the graph context and plugged into an enhanced GAT architecture, which is thus able to leverage both the positional and content information of each node. The model is trained jointly to optimize for the node classification task as well as for predicting the graph context. Experimental results show that GAT-POS reaches remarkable improvement compared to strong GNN baselines and recent graph embedding methods on non-homophilic graphs.
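To make the idea concrete, the sketch below shows one attention head of a GAT-style layer in which each node's content features are augmented with a positional embedding before the shared linear transform and masked attention. This is a minimal illustrative implementation in plain NumPy, not the authors' code; the function name `gat_pos_layer` and all tensor shapes are assumptions for the example.

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_pos_layer(H, P, A, W, a):
    """One attention head over node content features H (n x d_h),
    positional embeddings P (n x d_p), adjacency A (n x n, with
    self-loops), shared weight W, and attention vector a."""
    X = np.concatenate([H, P], axis=1)   # augment content with position
    Z = X @ W                            # shared linear transform
    n = Z.shape[0]
    out = np.zeros_like(Z)
    for i in range(n):
        nbrs = np.nonzero(A[i])[0]       # masked attention: neighbors only
        scores = leaky_relu(
            np.array([a @ np.concatenate([Z[i], Z[j]]) for j in nbrs])
        )
        alpha = softmax(scores)          # attention coefficients over nbrs
        out[i] = alpha @ Z[nbrs]         # weighted aggregation
    return out
```

Each output row is a convex combination of transformed neighbor representations, so non-homophilic structure can still influence the weighting through the positional part of the input.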


Related articles

Graph Attention Networks

We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations. By stacking layers in which nodes are able to attend over their neighborhoods’ features, we enable (implicitly) specifying different weight...


Watch Your Step: Learning Graph Embeddings Through Attention

Graph embedding methods represent nodes in a continuous vector space, preserving information from the graph (e.g. by sampling random walks). There are many hyper-parameters to these methods (such as random walk length) which have to be manually tuned for every graph. In this paper, we replace random walk hyperparameters with trainable parameters that we automatically learn via backpropagation. ...


Crossing Minimization within Graph Embeddings

We propose a novel optimization-based approach to embedding heterogeneous high-dimensional data characterized by a graph. The goal is to create a two-dimensional visualization of the graph structure such that edge-crossings are minimized while preserving proximity relations between nodes. This paper provides a fundamentally new approach for addressing the crossing minimization criteria that exp...


Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks

Celebrated Sequence to Sequence learning (Seq2Seq) and its fruitful variants are powerful models that achieve excellent performance on tasks that map sequences to sequences. However, there are many machine learning tasks with inputs naturally represented as graphs, which imposes significant challenges on existing Seq2Seq models for lossless conversion from graph form to the sequ...


Revisiting Semi-Supervised Learning with Graph Embeddings

We present a semi-supervised learning framework based on graph embeddings. Given a graph between instances, we train an embedding for each instance to jointly predict the class label and the neighborhood context in the graph. We develop both transductive and inductive variants of our method. In the transductive variant of our method, the class labels are determined by both the learned embedding...



Journal

Journal title: Lecture Notes in Computer Science

Year: 2021

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-030-75762-5_41